Degree and Principal Eigenvectors in Complex Networks

Authors

  • Cong Li
  • Huijuan Wang
  • Piet Van Mieghem
Abstract

The largest eigenvalue λ1 of the adjacency matrix powerfully characterizes dynamic processes on networks, such as virus spread and synchronization. The minimization of the spectral radius by removing a set of links (or nodes) has been shown to be an NP-complete problem. So far, the best heuristic strategy is to remove links/nodes based on the principal eigenvector corresponding to the largest eigenvalue λ1. This motivates us to investigate properties of the principal eigenvector x1 and its relation with the degree vector. (a) We illustrate and explain why the average E[x1] decreases with the linear degree correlation coefficient ρD in a network with a given degree vector; (b) the difference between the principal eigenvector and the scaled degree vector is proved to be smallest when λ1 = N2/N1, where Nk is the total number of walks in the network with k hops; (c) the correlation between the principal eigenvector and the degree vector decreases when the degree correlation ρD is decreased.
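
The quantities above are easy to examine numerically. The following sketch (a minimal illustration with numpy and networkx on a random test graph, not code from the paper) computes λ1, the principal eigenvector x1, the degree vector d, the walk counts Nk = u^T A^k u, the degree correlation ρD and the Pearson correlation between x1 and d, so that the ratio N2/N1 in statement (b) and the correlations in statements (a) and (c) can be inspected directly:

    import numpy as np
    import networkx as nx

    G = nx.erdos_renyi_graph(n=500, p=0.02, seed=1)
    A = nx.to_numpy_array(G)                        # adjacency matrix
    u = np.ones(A.shape[0])                         # all-one vector

    # largest eigenvalue and principal eigenvector (eigh returns ascending eigenvalues)
    eigvals, eigvecs = np.linalg.eigh(A)
    lam1 = eigvals[-1]
    x1 = eigvecs[:, -1]
    if x1.sum() < 0:                                # fix the sign so that x1 is non-negative
        x1 = -x1

    d = A @ u                                       # degree vector
    N1 = u @ A @ u                                  # number of 1-hop walks (= 2L)
    N2 = u @ A @ A @ u                              # number of 2-hop walks
    rho_D = nx.degree_assortativity_coefficient(G)  # linear degree correlation rho_D

    print("lambda_1    :", lam1)
    print("N2/N1       :", N2 / N1)                 # compare with lambda_1, statement (b)
    print("E[x1]       :", x1.mean())               # statement (a)
    print("corr(x1, d) :", np.corrcoef(x1, d)[0, 1])  # statement (c)
    print("rho_D       :", rho_D)

Repeating the computation on degree-preserving rewirings of the same graph, which change ρD while keeping the degree vector fixed, is one way to observe the trends claimed in (a) and (c).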


Similar articles

EIGENVECTORS OF COVARIANCE MATRIX FOR OPTIMAL DESIGN OF STEEL FRAMES

In this paper, the discrete method of eigenvectors of the covariance matrix has been used for weight minimization of steel frame structures. The Eigenvectors of Covariance Matrix (ECM) algorithm is a robust, iterative method for solving optimization problems and is inspired by the CMA-ES method. Both methods use the covariance matrix in the optimization process, but the covariance matrix calcula...
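
The excerpt is cut off before the ECM covariance update is described, so the sketch below shows only a generic covariance-matrix-based search loop in the spirit of CMA-ES (a cross-entropy-style update, not the ECM algorithm itself); the objective `sphere` is a placeholder standing in for a frame-weight function, and all parameter values are arbitrary:

    import numpy as np

    def sphere(x):                       # placeholder objective standing in for frame weight
        return float(np.sum(x ** 2))

    rng = np.random.default_rng(0)
    dim, pop, elite = 10, 40, 10
    mean = rng.normal(size=dim)          # centre of the search distribution
    cov = np.eye(dim)                    # covariance matrix of the search distribution

    for generation in range(100):
        samples = rng.multivariate_normal(mean, cov, size=pop)   # sample candidate designs
        fitness = np.array([sphere(s) for s in samples])
        best = samples[np.argsort(fitness)[:elite]]               # keep the best candidates
        mean = best.mean(axis=0)                                  # re-centre the search
        cov = np.cov(best, rowvar=False) + 1e-6 * np.eye(dim)     # covariance of the elite set

    print("best objective found:", sphere(mean))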


Compression of Breast Cancer Images By Principal Component Analysis

The principle of dimensionality reduction with PCA is the representation of the dataset ‘X’ in terms of eigenvectors e_i ∈ R^N of its covariance matrix. The eigenvectors oriented in the direction with the maximum variance of X in R^N carry the most relevant information of X. These eigenvectors are called principal components [8]. Ass...
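
As a concrete illustration of this principle (a generic numpy sketch, not the paper's image-compression pipeline), the data matrix is centred, its covariance matrix is diagonalized, and the rows of X are projected onto the k eigenvectors with the largest eigenvalues:

    import numpy as np

    def pca_project(X, k):
        """Project the rows of X onto the k eigenvectors of cov(X) with the largest variance."""
        Xc = X - X.mean(axis=0)                  # centre the data
        C = np.cov(Xc, rowvar=False)             # covariance matrix of the features
        eigvals, eigvecs = np.linalg.eigh(C)     # eigenvalues in ascending order
        components = eigvecs[:, ::-1][:, :k]     # k principal components e_i
        return Xc @ components, components       # scores and components

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 50))               # placeholder data matrix
    scores, components = pca_project(X, k=5)
    print(scores.shape, components.shape)        # (200, 5) (50, 5)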


A Fast Approach to the Detection of All-Purpose Hubs in Complex Networks with Chemical Applications

A novel algorithm for the fast detection of hubs in chemical networks is presented. The algorithm identifies a set of the most significant nodes in the network, intended to serve as the most effective points of distribution for fast, widespread coverage throughout the system. We show that our hubs have, in general, greater closeness centrality and betweenness centrality than vertices with maximal degree, w...
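
The hub-detection algorithm itself is not given in this excerpt; the snippet below (an illustration using networkx on a Barabási–Albert test graph, not the paper's method) only shows how the comparison mentioned above can be made, by scoring the maximal-degree vertices with closeness and betweenness centrality; any candidate hub set could be scored the same way:

    import networkx as nx

    G = nx.barabasi_albert_graph(n=300, m=3, seed=1)   # test graph
    closeness = nx.closeness_centrality(G)
    betweenness = nx.betweenness_centrality(G)

    # score the five maximal-degree vertices; a candidate hub set would be scored the same way
    top_degree = sorted(G.nodes, key=G.degree, reverse=True)[:5]
    for v in top_degree:
        print(v, G.degree(v), round(closeness[v], 4), round(betweenness[v], 4))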


Detection of Fake Accounts in Social Networks Based on One Class Classification

Detection of fake accounts on social networks is a challenging task. Previous methods for identifying fake accounts have not considered the strength of users' communications, which reduces their effectiveness. In this work, we present a detection method based on user similarities that takes the users' network communications into account. In the first step, similarity ...
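
The excerpt stops before the similarity features are defined; purely to illustrate the one-class setting named in the title (not the paper's method), the sketch below fits scikit-learn's OneClassSVM on a placeholder feature matrix of genuine accounts and flags deviating accounts as suspected fakes:

    import numpy as np
    from sklearn.svm import OneClassSVM

    rng = np.random.default_rng(0)
    genuine_features = rng.normal(loc=0.0, scale=1.0, size=(500, 8))  # hypothetical features of genuine accounts
    unknown_features = rng.normal(loc=3.0, scale=1.0, size=(20, 8))   # accounts to screen

    clf = OneClassSVM(kernel="rbf", gamma="scale", nu=0.05)
    clf.fit(genuine_features)                   # learn the region occupied by genuine accounts only

    labels = clf.predict(unknown_features)      # +1 = looks genuine, -1 = suspected fake
    print("suspected fakes:", int((labels == -1).sum()), "of", len(labels))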




Journal title:

Volume   Issue

Pages  -

Publication date: 2012